Project Info

Objective:
Design of a mechanism that adds Z-axis motion for manual/automatic focusing of a microscope using a stepper motor. In addition, a web interface allows the X, Y and Z steppers to be controlled remotely over the local network and provides live video streaming.

Team Members:
1. Mpantes Fotis
2. Varvagiannis Efstratios
3. Saloufas Mihalis
4. Athanasopoulos Athanasios

Team Contact:
Varvagiannis Efstratios (stratiscaster@gmail.com)

Mentor:
Tzeranis Dimitrios (tzeranis@gmail.com)

Consultants:
Polesiouk Alexander (alexanderpol3@gmail.com)

Instructions for members:
  1. Name all files you attach with a string that starts with "Project_20152016_MicroscopeCamera_"
    • e.g. upload a picture of a component as "Project_20152016_MicroscopeCamera_component.png"
  2. Write ALL your text in ENGLISH
  3. Use as much text as necessary; you will be graded on quality and clarity, not quantity
  4. Exploit the table function of wikispaces in order to organize your text and images.


The presentation of this project is attached to this page.

A detailed report and user manual (Report.pdf) is also attached to this page.


Milestone 1: Preliminary design

Hardware Part: Design Specifications
In order to control the Z-axis motion of the microscope's stage with a stepper motor, we first need to model the dynamics of the system. We will use the microscope's built-in mechanism that moves the stage, so we have to couple the shaft shown in the next figure to the motor's shaft:
Project_20152016_MicroscopeCamera_1.jpg


We will rotate the shaft using a stepper motor manufactured by Nanotec (model ST4209M1206, NEMA 17). Specifications for this motor can be found in the technical documentation [2].


Hardware Part: Conceptual Design
Microscope Stage Motion System Modelling
We are not interested in the details of the microscope's transmission system, and we assume that its inertia is negligible; we only use the transmission ratio i = 180 μm per revolution. The mass of the stage is estimated at m = 2 kg. We therefore use the following simplified model:

Project_20152016_MicroscopeCamera_2.jpg

From the above figure we see that the equivalent distance of the stage from the axis of the rotating shaft (or the gear's radius) is calculated as follows:
1 rev → 180 μm, thus (2π)R = 180 μm → R = 28.65 μm.

The equation that describes the system's behavior is: (J + mr²)a + mgr + Tf = Tin
where:
  • a is the angular acceleration (rad/s²)
  • J is the total inertia: the rotor inertia of the motor (JR) plus the inertia of the microscope's shaft (Js), i.e.
J = JR + Js
  • the inertial term mr² is of the order of 10⁻⁹ kgm² and is therefore negligible
  • mgr = 19.62 × 28.65×10⁻⁶ = 0.56×10⁻³ Nm
  • Tf represents the friction losses of the system. We cannot calculate their exact value, but they are expected to be quite small, since they depend on the speed, which will be low. A generic model such as Tf = (BR + BL·r)ω, where BR and BL are the loss coefficients for the rotational and the linear motion respectively, can be applied. The exact values of these coefficients could be estimated experimentally, but we assume a value of B = 0.0005 for the overall (linear + rotational) friction coefficient.
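
As a quick numerical check of the terms above, the short Python sketch below reproduces the equivalent radius and the gravity torque from the estimates quoted in this section (it is only a sketch of the hand calculation, not a project script):

import math

# Transmission ratio: one shaft revolution moves the stage by 180 um
pitch = 180e-6                  # m per revolution
R = pitch / (2 * math.pi)       # equivalent radius, ~28.65e-6 m

m = 2.0                         # estimated stage mass (kg)
g = 9.81                        # m/s^2
T_gravity = m * g * R           # ~0.56e-3 Nm, torque due to the stage weight
inertial_term = m * R**2        # ~1.6e-9 kg m^2, negligible next to the rotor inertia

print(R, T_gravity, inertial_term)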

Calculation of Js
The shaft consists of a cylinder with a diameter of 3 mm and a length of 12 mm and a second one with a diameter of 8 mm and a length of 22 mm (see the sketch below). Assuming that the shaft's material is steel (density = 7800 kg/m³), we can calculate the mass and therefore the inertia of each cylinder.
  • m1 = 0.012 × 7800 × (πr1²) = 0.66×10⁻³ kg
  • Js1 = 0.5·m1·r1² = 7.4×10⁻¹⁰ kgm²

  • m2 = 0.022 × 7800 × (πr2²) = 8.6×10⁻³ kg
  • Js2 = 0.5·m2·r2² = 688×10⁻¹⁰ kgm²

Js = Js1 + Js2 = 695.4×10⁻¹⁰ kgm² ≈ 0.07×10⁻⁶ kgm²

It is important to note that the calculated shaft inertia is significantly smaller than the rotor's inertia, which is found from the motor's specifications to be JR = 5.4×10⁻⁶ kgm².

The overall inertia of the coupled motor-microscope system is approximately J = 5.5×10⁻⁶ kgm².
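
The inertia calculation above can be reproduced with the following short sketch (dimensions and density as quoted; the rotor inertia JR is taken from the datasheet):

import math

rho = 7800.0                     # steel density, kg/m^3

def solid_cylinder(d, L):
    """Mass and axial moment of inertia of a solid cylinder of diameter d and length L."""
    r = d / 2.0
    mass = rho * math.pi * r**2 * L
    return mass, 0.5 * mass * r**2

m1, Js1 = solid_cylinder(0.003, 0.012)   # ~0.66e-3 kg, ~7.4e-10 kg m^2
m2, Js2 = solid_cylinder(0.008, 0.022)   # ~8.6e-3 kg, ~688e-10 kg m^2

Js = Js1 + Js2                           # ~0.07e-6 kg m^2 (microscope shaft)
JR = 5.4e-6                              # rotor inertia from the motor datasheet
J = JR + Js                              # ~5.5e-6 kg m^2 overall
print(Js, J)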


Stepper Motor Sizing
Is the specific stepper motor appropriate for our task?
We have to take the following requirements into consideration:
  • The rotation of the shaft is very slow (approaching the speed at which it is rotated by a human hand), n = 10-20 rpm.
  • The stepper motor shouldn’t lose any steps.
  • We should be able to reverse the direction of rotation quickly.
The torque-speed curve of the motor is shown below (for 1.2 A, 14 V):
Project_20152016_MicroscopeCamera_3.jpg
Max torque: 0.238 Nm

The torque needed to accelerate our system is definitely below the red curve, so the stepper motor will not lose synchronism during acceleration due to the system's inertia.

Our motor will operate at low speeds (10-20 rpm), so we can assume that this operating point lies within the locked-step region, which means that the motor decelerates between steps and is able to reverse its motion instantly. Operating in this range, the stepper does not lose any steps.

For a basic inertial calculation we assume a maximum speed of 20 rpm = 2.09 rad/s. Our stepper motor has a step angle of 0.9°, so we will be sending
f = 2.09 rad/s × (180/π) / 0.9° = 133.05 Hz, i.e. pulses per second. Assuming a 1/3-1/3-1/3 acceleration / steady-state / deceleration profile for each step, the acceleration is calculated as:
2.09 rad/s / (0.333 × (1/133.05) s) = 834 rad/s².

The torque required for this acceleration, using the inertia calculated above, is:
Tin = 4.5×10⁻³ Nm, much smaller than the maximum holding torque of the motor, so the motor can accelerate and decelerate the load within the locked-step region (the moment due to the mass of the stage is about 10 times smaller than the moment required to accelerate the stepper, so it is not included here). We can therefore assume that the inertial phenomena are negligible.

During steady-state motion we have to overcome the moments due to the weight of the stage and the friction losses. As calculated above, this moment is approximately:
T = mgr + Bω = 1.5×10⁻³ Nm
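
The sizing numbers above can be reproduced with the following sketch (the 1/3-1/3-1/3 profile and the friction coefficient B = 0.0005 are the assumptions stated earlier):

import math

J = 5.5e-6                                 # total inertia, kg m^2 (previous section)
step_deg = 0.9                             # step angle of the ST4209, degrees

w_max = 20.0 * 2.0 * math.pi / 60.0        # 20 rpm ~ 2.09 rad/s
f = math.degrees(w_max) / step_deg         # ~133 pulses per second

# one third of each step period is spent accelerating
a = w_max / ((1.0 / 3.0) * (1.0 / f))      # ~834 rad/s^2
T_accel = J * a                            # ~4.5e-3 Nm << 0.238 Nm holding torque

# steady state: stage weight plus the assumed friction model B*w
R = 28.65e-6                               # equivalent radius, m
B = 0.0005
T_steady = 2.0 * 9.81 * R + B * w_max      # ~1.5e-3 Nm

print(f, a, T_accel, T_steady)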

Connecting the two shafts
In order to connect the rotor shaft with the microscope's shaft we need a coupler (there is no need for a flexible coupler) with a 3 mm hole and a 5 mm hole. If such a coupler cannot be found, one with smaller holes can be modified to fit our case.

A sketch of the coupling of the two shafts is the following:
Project_20152016_MicroscopeCamera_4.jpg

Such a coupler is commercially available (see the link below):
http://grobotronics.com/shaft-coupler-solid-3mm-3mm.html

Software Part: Overview
Our software design consists of a Raspberry Pi micro-computer and an Arduino Micro microcontroller. Using the Pi Camera (designed especially for the Raspberry Pi) we will continuously take pictures of the stage. The pictures will be stored on the Pi's storage. The Raspberry Pi will act as a basic HTTP server (possibly using the Apache 2 server) on the local network of the lab and will provide video streaming to every wireless device connected to it using the MJPEG (Motion JPEG) technology. To add more functionality, the user will be able to control the steppers that move the microscope's stage through a simple HTML/PHP interface. The user's requests will be transmitted to the Arduino board via serial communication with the Pi over a USB cable. Moreover, the Pi will perform basic image processing (i.e. determining whether an image is blurred) using the open-source computer vision library OpenCV 2, which is available to C++ and Python developers. A schematic of the above description follows:

Project_20152016_MicroscopeCamera_Overview.jpg



Milestone 2: Detailed design

Hardware Part: Detailed Design

All the drawings we made can be downloaded from the file attached below.

The 2-D drawings are also appended to the Report.pdf file.


Motor attachment
The design of a small structure is needed in order to couple the shafts of the stepper motor and the microscope, as described in the section above. On the bearing that supports the microscope's shaft there are three holes with an M3 thread, as shown below, which can be used to fasten our structure to the microscope's body and keep the two shafts aligned.
Project_20152016_MicroscopeCamera_6.jpg
The following flange is mated with the microscope's body, taking advantage of the three M3 holes.

flange.JPG
The part shown below is fastened to the above flange using countersunk bolts. Three identical parts will be manufactured on a lathe from aluminium to ensure the strength of this structure and, of course, small displacements (see below for a simple static analysis).
Capture.JPG

In order to mate the motor with these cylindrical parts, a flange was designed that matches the geometry of the motor's front face (see the motor specifications for a detailed drawing of the motor's front face - technical documentation [2]).

motormount.JPG

The above part is fastened to the motor with 4 M3 bolts, as well as to the 3 cylinders with M3 countersunk bolts (Allen-type or slotted) placed normal to the longitudinal direction of each of these parts.

A static analysis was performed (using a simple FEM analysis in SolidWorks) in order to determine the sturdiness and the bending deflection of this structure, since large bending deflections can cause loss of alignment between the two shafts.
The maximum load on each cylindrical part is estimated at 1.5 N and the material is aluminium.
fem.JPG
Results:

The above figure shows that a deflection of around 5 μm is calculated at the end of the part, which is acceptable considering the manufacturing precision of each part.

A final drawing of the whole design (including the coupler) follows:
exploded.JPG
assemblly.jpg


Important Note: In the absence of exact measurements, no dimensions have been added to the above drawings. More detailed drawings of the finally manufactured parts will be uploaded.

Pi Camera Base

In order to mount the camera on the microscope, a 3D-printable base was designed. A 3D drawing of this base follows:

Project_20152016_MicroscopeCamera_Basement.jpg

Electronic circuit

The schematic of the electronic circuit that was designed for this project is the following:
Project_20152016_MicroscopeCamera_schematic.jpg
The components are:
  1. Arduino Micro
  2. Stepper motor
  3. Jack for external power supply
  4. 100 μF capacitor
  5. Stepper motor driver
  6. Joystick connected to an analog input of the Arduino
  7. Indicating LED
  8. 220 Ω resistor
  9. 1 kΩ resistor
  10. Switch for initialization of the Z=0 position

An electronic drawing of this circuit has been appended to the Report.pdf file, which can be downloaded above.


Software Part: Implementation


All the source files, written in Python and PHP/HTML, can be downloaded from this site and are included in the Software.zip file below. This archive has a directory structure that mirrors the root structure of the Raspbian system: copy each file to the corresponding destination on your Raspberry Pi and the project is ready to run.


Below we give some information about the software's capabilities and structure. Please refer to the Report.pdf file for supplementary and more specific information.

Taking pictures

A variety of solutions is available for making a video stream accessible on the local network. A fairly simple solution, which does not require much low-level coding and works well both for web browsers and for self-written client applications receiving the stream, is the MJPEG (Motion JPEG) technique, which in simple words consists of streaming a sequence of plain JPEG images with no need for video compression (assuming that we are not interested in transmitting sound).
A simple implementation of this technique is to continuously take a picture with the camera and store it in the same file each time, using JPEG encoding. A script watches this file for changes and, whenever a change occurs, transmits the file via an HTTP server, a WebSocket or a more generic TCP/IP protocol (e.g. with Python's built-in socket library).
Fortunately, there are implementations of this technique available as open-source applications, such as mjpg-streamer:
http://sourceforge.net/projects/mjpg-streamer/

so, after compiling and installing this application, streaming MJPEG video is easy:
LD_LIBRARY_PATH=/usr/local/lib mjpg_streamer -i "input_file.so -f [directory containing the changing image] -n [changing image filename].jpg" -o "output_http.so -w /usr/local/www"

By default mjpg-streamer streams through the port 8080.

The only thing needed is a simple script that reads the camera at the preferred frame rate and saves the picture over the same file. Such a script is easy to write with several tools, but doing it via OpenCV 2 offers the additional advantage that we can run an image-processing algorithm at the moment the picture is taken and/or stream the processed image if we wish.

As we will use the Pi Camera, a Python library such as the picamera module (actually picamera[array], since OpenCV reads picamera captures as NumPy arrays; https://picamera.readthedocs.org/en/release-1.10/#) needs to be installed on the Raspberry Pi. The picamera module lets the user control many parameters of the Pi Camera, such as shutter speed, brightness, etc.
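
A minimal sketch of such an acquisition loop is shown below. It assumes picamera[array] and OpenCV are installed; the output path, file name, resolution and frame rate are illustrative choices, not the exact values used in the project scripts.

import time
import cv2
from picamera import PiCamera
from picamera.array import PiRGBArray

OUT_FILE = "/var/www/stream/frame.jpg"     # path and name are assumptions
FPS = 2                                    # a low frame rate is enough here

camera = PiCamera(resolution=(640, 480))
raw = PiRGBArray(camera, size=(640, 480))
time.sleep(2)                              # let the sensor settle

for frame in camera.capture_continuous(raw, format="bgr", use_video_port=True):
    img = frame.array                      # NumPy array, ready for OpenCV
    # any image processing (e.g. blur detection) could be inserted here
    cv2.imwrite(OUT_FILE, img)             # overwrite the file watched by mjpg-streamer
    raw.truncate(0)                        # reuse the buffer for the next frame
    time.sleep(1.0 / FPS)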


Note 1: The script for image acquisition is included in the Software.zip file that can be downloaded above, as are all the scripts described below.
Note 2: Links to some of the numerous tutorials on how to install all this software on the Pi are included in the web sites section.
Note 3: At the time of writing, the MJPEG technology seems to be supported natively only by the Mozilla Firefox and Safari web browsers. A solution supporting more browsers may be available via JavaScript.

The above software is started during the boot of the Raspberry Pi by editing the /etc/rc.local file and booting the RPi in text mode.
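
As an illustration, the lines added to /etc/rc.local (before the final exit 0) could look like the following; the stream directory, file name and acquisition-script name are placeholders, not the exact ones from the project:

LD_LIBRARY_PATH=/usr/local/lib mjpg_streamer -i "input_file.so -f /var/www/stream -n frame.jpg" -o "output_http.so -w /usr/local/www" &
python /home/pi/capture.py &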

Raspberry Pi and Arduino micro communication

As briefly described in the overview section, in order to control the stepper motor attached to the Arduino board, data must be transferred from the Raspberry Pi. The simplest way for these two boards to communicate is serial communication over a USB cable. The Arduino acts as a slave device waiting for the appropriate command from the RPi. On the Pi side, a simple Python script using the pySerial module forwards these commands to the Arduino according to the user's requests (the Com.py script).

This script can be executed from the command line using the python2 interpreter, with the general form:
python [scriptname] -X[num1] -Y[num2] -Z[num3] -A[-1 or 0 or 1]
After each of the X,Y,Z arguments the number of steps follows as an integer.

After the A argument, a flag indicates whether auto-focus mode is requested; the script writes it to the serial port so the Arduino can light a LED indicating that the microscope is in auto-focus mode. More specifically, -1 starts the auto-focus procedure from the current position, 1 performs the auto-focus procedure starting from the point where Z = 0, and 0 terminates the auto-focus procedure and turns the LED off.

On the serial bus we simply pass command sequences such as X50, which means that the stepper motor should perform 50 steps, i.e. rotate 50 × 0.9 = 45 deg. So in fact the above script is just an echo of stdin to the serial bus, but it is our basis for controlling the steppers through the web interface.
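
A stripped-down sketch of such a relay script is shown below (this is not the actual Com.py; the serial port, baud rate and error handling are assumptions). It would be called as described above, e.g. python sketch.py -X50 -Z10 -A0.

import sys
import serial                       # pySerial

PORT = "/dev/ttyACM0"               # assumption: typical port for an Arduino Micro
BAUD = 9600                         # assumption

def main(args):
    ser = serial.Serial(PORT, BAUD, timeout=2)
    for arg in args:                # e.g. -X50 -Y-20 -Z10 -A0
        axis, value = arg[1], arg[2:]
        if axis in ("X", "Y", "Z", "A"):
            ser.write(axis + value + "\n")   # forward the command, e.g. "X50"
                                             # (Python 2 string; encode for Python 3)
    ser.close()

if __name__ == "__main__":
    main(sys.argv[1:])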

Arduino code

On the Arduino side, the program listens on the serial port for an incoming command with the above syntax and performs the requested action. Some extra features were added, such as manual control via a joystick and initialization of the program by pressing a button positioned at the Z = 0 point of the stage. This makes it possible to know the position of the stepper after each command (assuming that the stepper performs the given number of steps without losing any), so a maximum position can be defined above which the stage will not move upwards, protecting the lenses. If a command would exceed this position, a blinking LED informs us and no action is taken.

As the linear velocity of the stage is very small, it could be a great waste of time to wait for the stage to reach the Z = 0 position, so the user may press the button to define a Z = 0 plane of his own choice; he must then remember to change the maximum allowed number of steps that protects the lenses and upload the new sketch to the Arduino board. This is done by changing the maxAloowedPos constant of the Arduino program (more information in Report.pdf).

Please refer to the hardware section for the wiring of the arduino board.
The arduino program named as Final_Sketch.ino is included in the Software.zip file.


Verifying the correct serial port

A small Python script lists all the serial devices of the Raspberry Pi and tries to connect to each of them by sending a predefined message (the character 'T') and waiting for 2 seconds. If a device replies with a predefined message (actually simply 'OK'), it is the Arduino device we are going to use. This feature is very useful if multiple Arduino boards are connected to the Pi. The detected serial port is saved in the file serialPort.dat in the working directory.
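
The idea can be sketched as follows (the handshake characters and output file are as described above; the device name patterns and timings are assumptions):

import glob
import time
import serial

def find_arduino():
    for dev in glob.glob("/dev/ttyACM*") + glob.glob("/dev/ttyUSB*"):
        try:
            ser = serial.Serial(dev, 9600, timeout=2)
            time.sleep(2)                 # the Arduino resets when the port opens
            ser.write("T")                # predefined probe character
            reply = ser.read(2)           # wait up to 2 s for the answer
            ser.close()
        except serial.SerialException:
            continue
        if reply == "OK":
            with open("serialPort.dat", "w") as f:
                f.write(dev)
            return dev
    return None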

Image processing and auto focus procedure

In order to implement a simple auto-focus procedure, image processing that detects the blurriness of an image is needed. Many different algorithms, both simple and more sophisticated, are described in the literature. In our case a very basic algorithm, which measures the blurriness over the whole image plane and returns a single value indicating whether an image is blurred or not, was preferred for its simplicity.

This algorithm computes the variance of the Laplacian [2] of the whole image. The Laplacian image is obtained by applying the Laplace operator at each point of a grayscale image, i.e. by convolving the image with a Laplace kernel such as the following:

Project_20152016_MicroscopeCamera_lap_kernel.png


It is known from basic image processing that the Laplace operator is used to detect the edges in an image. An example of a picture and its Laplacian follows, verifying this statement.
Project_20152016_MicroscopeCamera_laplacian_img.png

The next step is to calculate the variance of the Laplacian image by using the following expression:
Project_20152016_MicroscopeCamera_expr.jpg

Fortunately, this procedure takes only one line of code in OpenCV:
bv = cv2.Laplacian(gray, cv2.CV_64F).var()

So in an image with sharp edges (i.e. a better-focused image) a greater variance value is calculated than in more defocused pictures. In other words, using this algorithm our goal is to find the Z position where:
F = LAP_VAR = MAX

which amounts to a classic optimization problem. Ideas on how to find this position are proposed in several papers where auto-focus techniques are discussed. In our case a simple procedure was chosen: start from a point and keep moving until the next image taken is found to be more blurred (i.e. its LAP_VAR is lower than that of the previous image), at which point the stage returns to the previous position. This method is not very robust, but it is simple and worked for this project.
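
A sketch of this search is given below; capture_frame() and move_z() stand for the project's image-acquisition and serial-command functionality and are assumed helpers, and the step size is arbitrary:

import cv2

def lap_var(image_file):
    """Blurriness score: variance of the Laplacian of the grayscale image."""
    gray = cv2.imread(image_file, cv2.IMREAD_GRAYSCALE)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def autofocus(capture_frame, move_z, step=10, max_moves=100):
    best = lap_var(capture_frame())
    for _ in range(max_moves):
        move_z(step)                     # advance the stage by a few steps
        score = lap_var(capture_frame())
        if score < best:                 # the image got blurrier: focus was passed
            move_z(-step)                # go back to the sharper position and stop
            break
        best = score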

For example, for the following pictures the algorithm gave us the following results:

LAP_VAR = 32.032
Project_20152016_MicroscopeCamera_frame0.jpg

LAP_VAR = 33.780
Project_20152016_MicroscopeCamera_frame1.jpg

LAP_VAR = 40.698 = MAX --> On Focus
Project_20152016_MicroscopeCamera_frame3.jpg

LAP_VAR = 36.863
Project_20152016_MicroscopeCamera_frame4.jpg

Finding that at the 4th position the image has a smaller LAP_VAR value than the previous one (meaning that it is more defocused), the algorithm stops and returns the stage to the previous position.


Web interface-Apache Server

As mentioned above, a graphical interface was developed in order to control the steppers, watch the video stream and access the above Python scripts. Such an interface can be developed in Python or C++ (e.g. using Qt) and run locally on the Raspberry Pi, or it can be an HTML5 form handled by simple PHP running on the Pi, which then acts as an HTTP server.

In this project a web interface was preferred, as it can be accessed by any wireless device connected to the local network of the lab. In addition, this interface simplifies the client side (the wireless device), as a common browser performs that task.

In order to run an HTTP server on the Raspberry Pi, the open-source Apache server with the required PHP modules was installed. For installation instructions please refer to the Report.pdf file uploaded here or to Apache's official web page.


This interface can be seen in the following image
Project_20152016_MicroscopeCamera_Interface.jpg

Apart from the scripts listed above, some extra options are available through this interface using simple Linux shell commands on the Raspberry Pi (such as cp, zip and rm). More specifically, the user can save frames on the local disk of the RPi (SD card) by copying the current frame.bmp image acquired by the image-acquisition script from the /var/www/stream directory to the /var/www/html directory, and then download them as a zip archive (compressed with the Linux zip program). Finally, it is recommended to clear these images from the Pi's disk afterwards. As mentioned above, the images are saved in an uncompressed format (BMP), so there is no loss of data due to compression. The filenames of the images are of the form
d-m-y_H:i:s, where
  1. d stands for the day
  2. m stands for the month, etc.

More information can be found in the Report.pdf file or by viewing the source file index.php included in the Software.zip file.

Note: Reverse Proxying the image stream

In order to embed the image stream in the interface, reverse-proxying port 8080 to port 80 (Apache's default port) is required. This can be done using Apache's mod_proxy module.
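
For example, with mod_proxy and mod_proxy_http enabled, directives of the following form in the Apache site configuration would forward a path such as /stream to the mjpg-streamer port (the /stream path is an assumption):

ProxyPass /stream http://localhost:8080/
ProxyPassReverse /stream http://localhost:8080/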

Note: User Permissions

The default user for Apache on a Unix-based system like Raspbian is the www-data user. In order to allow this user to run the Python scripts for serial communication, www-data must be added to the dialout group with the following command:
sudo usermod -a -G dialout www-data

At this point it is worth mentioning that root privileges are needed to access the RPi's GPIO pins, so www-data would have to be added to the root group as well. However, this is an obvious security flaw for the Raspberry Pi system, and that is the main reason an Arduino board is used to control the hardware (the stepper, the joystick, etc.).

Note: Users and passwords

The Apache server allows a user-password system to be created in order to lock some features of the page. This could be done after discussing with the lab members how many users must be created and what privileges they should have, to ensure that no unauthorized person can control the microscope.


Milestone 3: Prototype fabrication

In order to reach a prototype fabrication, the following manufacturing processes were used:
1. Turning (lathe)
Used for the construction of the 3 cylindrical parts made of aluminium (the outer diameter and length are achieved by turning).
2. Drilling
a) Adjustment of the inner diameter (over half the length) of the commercial coupler.
b) Hole making in the cylindrical parts.
3. Threading
Inner thread of the cylindrical parts.
4. Laser cutting
The two flanges (stainless steel) were manufactured using laser cutting (by SteelMax).
5. 3D printing
Printing of the plastic base of the Raspberry Pi camera.

The electronic circuit was soldered on a board.


Final picture

Project_20152016_MicroscopeCamera_Final.JPG





References
Research Papers
1. http://www.ti.com/lit/an/slyt482/slyt482.pdf
2. Diatom autofocusing in brightfield microscopy: a comparative study, J. L. Pech-Pacheco, G. Cristobal et al.
3. Digital Autofocus Methods for Automated Microscopy, Feimo Shen, Louis Hodgson and Klaus Hahn
4. A Comparison of Different Focus Functions for Use in Autofocus Algorithms, Frans C.A. Groen, Ian T. Young and Guido Ligthart

Technical Documentation
1. http://labs.pbrc.edu/cellbiology/documents/AxioskopManual.pdf
2. http://en.nanotec.com/fileadmin/files/Baureihenuebersichten/Schrittmotoren/Product_Overview_ST4209.pdf


Web sites
1. https://www.youtube.com/watch?v=T8T6S5eFpqE
2. http://gr.rsdelivers.com/product/raspberry-pi/raspberry-pi-camera-board/raspberry-pi-camera-board-video-module/7757731.aspx?query=Raspberry+Pi+Camera+Board
3. https://www.ptgrey.com/flea2-08-mp-color-firewire-1394b-sony-icx204
4. http://web.mit.edu/2.75/fundamentals/FUNdaMENTALS.html (focus on chapters 3-7)
5. Install OpenCV & Python on raspberry:
http://www.pyimagesearch.com/2015/02/23/install-opencv-and-python-on-your-raspberry-pi-2-and-b/
6. Install picamera[array] for python
https://picamera.readthedocs.org/en/release-1.10/install2.html
7. Install mjpg-streamer on the Raspberry Pi:
http://blog.miguelgrinberg.com/post/how-to-build-and-run-mjpg-streamer-on-the-raspberry-pi
8. http://www.w3schools.com/


(c) 2015 Department of Mechanical Engineering, National Technical University of Athens.